Projecting Markov Random Field Parameters for Fast Mixing

Xianghang Liu, Justin Domke

Neural Information Processing Systems

Markov chain Monte Carlo (MCMC) algorithms are simple and extremely powerful techniques to sample from almost arbitrary distributions. The flaw in practice is that it can take a large and/or unknown amount of time to converge to the stationary distribution. This paper gives sufficient conditions to guarantee that univariate Gibbs sampling on Markov Random Fields (MRFs) will be fast mixing, in a precise sense. Further, an algorithm is given to project onto this set of fast-mixing parameters in the Euclidean norm. Following recent work, we give an example use of this to project in various divergence measures, comparing univariate marginals obtained by sampling after projection to common variational methods and Gibbs sampling on the original parameters.
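The "univariate Gibbs sampling" in the abstract is single-site resampling from each variable's conditional distribution. Below is a minimal sketch for a pairwise binary MRF (an Ising model with couplings J and biases h); the function name and parameterization are illustrative assumptions, not the paper's code.

```python
# Minimal univariate (single-site) Gibbs sampler for a pairwise binary MRF
# p(x) ∝ exp(sum_i h[i] x[i] + sum_{i<j} J[i,j] x[i] x[j]), with x[i] in {-1, +1}.
# Illustrative sketch; burn-in and convergence diagnostics are omitted for brevity.
import numpy as np

def gibbs_marginals(J, h, n_sweeps, seed=None):
    """Run single-site Gibbs sweeps; return estimated marginals P(x_i = +1)."""
    rng = np.random.default_rng(seed)
    n = len(h)
    x = rng.choice([-1.0, 1.0], size=n)  # random initial state
    counts = np.zeros(n)                 # how often each x_i is +1
    for _ in range(n_sweeps):
        for i in range(n):
            # Conditional of x_i given the rest is logistic in the local field.
            field = h[i] + J[i].dot(x) - J[i, i] * x[i]
            p_plus = 1.0 / (1.0 + np.exp(-2.0 * field))
            x[i] = 1.0 if rng.random() < p_plus else -1.0
        counts += (x > 0)
    return counts / n_sweeps
```

The number of sweeps needed before these estimates stabilize is precisely the mixing time that the paper's sufficient conditions bound.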


Projecting Ising Model Parameters for Fast Mixing

Justin Domke, Xianghang Liu

Neural Information Processing Systems

Inference in general Ising models is difficult, due to high treewidth making tree-based algorithms intractable. Moreover, when interactions are strong, Gibbs sampling may take exponential time to converge to the stationary distribution. We present an algorithm to project Ising model parameters onto a parameter set that is guaranteed to be fast mixing, under several divergences. We find that Gibbs sampling using the projected parameters is more accurate than with the original parameters when interaction strengths are strong and when limited time is available for sampling.
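The fast-mixing set here is, roughly, a Dobrushin-style spectral bound on the strength of the pairwise couplings. As a hedged sketch, assuming the set is a spectral-norm ball on the symmetric coupling matrix J, the Euclidean (Frobenius) projection simply clips singular values; the paper's actual condition is stated on the matrix of absolute couplings and its projection algorithm is more involved.

```python
# Euclidean projection of a symmetric coupling matrix J onto {J : ||J||_2 <= c}
# by clipping singular values -- a simplified stand-in for the paper's
# fast-mixing set, which constrains the matrix of *absolute* coupling strengths.
import numpy as np

def project_spectral(J, c=0.99):
    """Nearest matrix (in Frobenius norm) to J with spectral norm at most c."""
    U, s, Vt = np.linalg.svd(J)
    s = np.minimum(s, c)  # clip singular values at the fast-mixing bound
    return U @ np.diag(s) @ Vt
```

Sampling with the projected couplings targets a slightly different, less strongly coupled distribution, but one on which Gibbs sampling provably mixes quickly; the abstract's finding is that this trade-off wins when interactions are strong and sampling time is limited.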


Projecting Markov Random Field Parameters for Fast Mixing

Xianghang Liu, Justin Domke

arXiv.org Machine Learning

Markov chain Monte Carlo (MCMC) algorithms are simple and extremely powerful techniques to sample from almost arbitrary distributions. The flaw in practice is that it can take a large and/or unknown amount of time to converge to the stationary distribution. This paper gives sufficient conditions to guarantee that univariate Gibbs sampling on Markov Random Fields (MRFs) will be fast mixing, in a precise sense. Further, an algorithm is given to project onto this set of fast-mixing parameters in the Euclidean norm. Following recent work, we give an example use of this to project in various divergence measures, comparing univariate marginals obtained by sampling after projection to common variational methods and Gibbs sampling on the original parameters.
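To make "comparing univariate marginals" concrete: on a model small enough to enumerate, exact marginals serve as the ground truth against which sampled (or variational) marginals are scored. A self-contained brute-force sketch, with illustrative names:

```python
# Exact univariate marginals P(x_i = +1) for a tiny Ising model by enumerating
# all 2^n states -- feasible only for small n, but it provides the reference
# values that Gibbs-sampled or variational marginals can be compared against.
import itertools
import numpy as np

def exact_marginals(J, h):
    n = len(h)
    mass_plus = np.zeros(n)  # unnormalized mass of states with x_i = +1
    Z = 0.0                  # partition function
    for bits in itertools.product([-1.0, 1.0], repeat=n):
        x = np.array(bits)
        w = np.exp(h @ x + 0.5 * x @ J @ x)  # J symmetric with zero diagonal
        Z += w
        mass_plus += w * (x > 0)
    return mass_plus / Z
```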